Dimensionality Reduction via Euclidean Distance Embeddings
Authors
Abstract
This report provides a mathematically thorough review and investigation of Metric Multidimensional Scaling (MDS) through the analysis of Euclidean distances in the input and output spaces. By combining a geometric approach with modern linear algebra and multivariate analysis, Metric MDS is viewed as a Euclidean distance embedding transformation that converts between coordinate and coordinate-free representations of data. In this work we link Mercer kernel functions, data in an infinite-dimensional Hilbert space, and coordinate-free distance metrics to a finite-dimensional Euclidean representation. We further lay a foundation for a principled treatment of nonlinear extensions of MDS as optimization programs on kernel matrices and Euclidean distances.
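For concreteness, here is a minimal NumPy sketch (our own illustration, not code from the report) of the coordinate-free-to-coordinate direction of this view: classical metric MDS recovers a Euclidean configuration from a squared-distance matrix by double centering it into a Gram (kernel) matrix and taking its top eigenpairs.

```python
import numpy as np

def classical_mds(D_sq, r=2):
    """Recover an r-dimensional configuration from a squared Euclidean
    distance matrix via double centering (illustrative sketch only)."""
    n = D_sq.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    B = -0.5 * J @ D_sq @ J                        # Gram (kernel) matrix
    eigvals, eigvecs = np.linalg.eigh(B)           # ascending eigenvalues
    top = np.argsort(eigvals)[::-1][:r]            # indices of top-r eigenpairs
    scale = np.sqrt(np.maximum(eigvals[top], 0.0)) # clip tiny negative eigenvalues
    return eigvecs[:, top] * scale                 # n x r coordinates

# Usage: distances are reproduced (up to rotation/translation) when r
# matches the rank of the Gram matrix.
X = np.random.randn(10, 3)
D_sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
Y = classical_mds(D_sq, r=3)
D_sq_rec = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
assert np.allclose(D_sq, D_sq_rec, atol=1e-6)
```

The double-centered matrix B plays the role of the kernel matrix discussed above: its eigendecomposition is what turns a coordinate-free distance description back into explicit coordinates.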
Similar references
Word Re-Embedding via Manifold Dimensionality Retention
Word embeddings seek to recover a Euclidean metric space by mapping words into vectors, starting from word co-occurrences in a corpus. Word embeddings may underestimate the similarity between nearby words and overestimate it between distant words in the Euclidean metric space. In this paper, we re-embed pre-trained word embeddings with a stage of manifold learning which retains dimensionality...
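A minimal sketch of the kind of re-embedding pipeline this excerpt describes, using scikit-learn's Isomap as a stand-in manifold learner with the output dimension matched to the input dimension (the paper's actual method, data, and hyperparameters are not specified here; everything below is a placeholder):

```python
import numpy as np
from sklearn.manifold import Isomap

# Stand-in for pre-trained word vectors: 300 "words" in 50 dimensions.
# In practice these rows would be loaded from GloVe/word2vec for a vocabulary.
rng = np.random.default_rng(0)
E = rng.normal(size=(300, 50))

# Re-embed with a manifold learner while retaining dimensionality:
# n_components equals the input dimension, so only the metric structure
# (via geodesic, i.e. neighbourhood-graph, distances) is adjusted.
iso = Isomap(n_neighbors=15, n_components=50)
E_re = iso.fit_transform(E)
print(E_re.shape)  # (300, 50)
```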
2D Dimensionality Reduction Methods without Loss
In this paper, several two-dimensional extensions of principal component analysis (PCA) and linear discriminant analysis (LDA) have been applied in a lossless dimensionality reduction framework for face recognition applications. In this framework, the benefits of dimensionality reduction were used to improve the performance of the predictive model, which was a support vector machine (...
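As a rough illustration of the two-dimensional flavour of PCA referred to here, the sketch below implements a basic 2DPCA-style projection on matrix-shaped images; the paper's specific lossless framework and its SVM stage are not reproduced, and the data and parameter choices are placeholders.

```python
import numpy as np

def two_d_pca(images, d):
    """Basic 2DPCA sketch: images has shape (N, m, n). Returns the n x d
    projection matrix and the (N, m, d) projected feature matrices."""
    centered = images - images.mean(axis=0)
    # Image scatter matrix (n x n): average of A^T A over centered images.
    G = np.einsum('kij,kil->jl', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)
    X = eigvecs[:, np.argsort(eigvals)[::-1][:d]]  # top-d projection directions
    return X, images @ X                           # features: (N, m, d)

# Usage: 100 synthetic 32x32 "face" images reduced to 32 x 4 feature matrices,
# which could then be flattened and fed to an SVM classifier.
faces = np.random.rand(100, 32, 32)
X, feats = two_d_pca(faces, d=4)
print(feats.shape)  # (100, 32, 4)
```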
The Fast Johnson-Lindenstrauss Transform
While we omit the proof, we remark that it is constructive. Specifically, A is a linear map consisting of random projections onto subspaces of R^d. These projections can be computed by n matrix multiplications, which take time O(nkd). This is fast enough to make the Johnson-Lindenstrauss transform (JLT) a practical and widespread algorithm for dimensionality reduction, which in turn motivates th...
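A minimal sketch of the dense random-projection form of the JLT described in this excerpt (the fast, structured transform that gives the FJLT its name is not shown; sizes and seeds are arbitrary placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 1000, 200                  # n points in R^d, target dimension k

X = rng.normal(size=(n, d))               # data points (rows)
A = rng.normal(size=(d, k)) / np.sqrt(k)  # random Gaussian projection map
Y = X @ A                                 # n projections, O(nkd) time overall

# Pairwise distances are preserved up to a small multiplicative distortion.
i, j = 3, 7
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
print(orig, proj, proj / orig)            # ratio close to 1
```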
Dimension Reduction in the l1 norm
The Johnson-Lindenstrauss Lemma shows that any set of n points in Euclidean space can be mapped linearly down to O((log n)/ε²) dimensions such that all pairwise distances are distorted by at most 1 + ε. We study the following basic question: Does there exist an analogue of the Johnson-Lindenstrauss Lemma for the l1 norm? Note that the Johnson-Lindenstrauss Lemma gives a linear embedding which is inde...
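For reference, the standard l2 statement of the lemma invoked here, as it is usually written (background material, not text from the paper):

```latex
% Johnson-Lindenstrauss lemma (standard \ell_2 form).
% For every 0 < \varepsilon < 1 and every set of n points
% x_1, \dots, x_n \in \mathbb{R}^d, there is a linear map
% f : \mathbb{R}^d \to \mathbb{R}^k with k = O(\varepsilon^{-2} \log n) such that
(1 - \varepsilon)\,\lVert x_i - x_j \rVert_2^2
  \;\le\; \lVert f(x_i) - f(x_j) \rVert_2^2
  \;\le\; (1 + \varepsilon)\,\lVert x_i - x_j \rVert_2^2
\qquad \text{for all } i, j.
```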
Subspace Least Squares Multidimensional Scaling
Multidimensional Scaling (MDS) is one of the most popular methods for dimensionality reduction and visualization of high-dimensional data. Apart from these tasks, it has also found applications in the field of geometry processing for the analysis and reconstruction of non-rigid shapes. In this regard, MDS can be thought of as a shape-from-metric algorithm, consisting of finding a configuration of p...
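A compact sketch of least-squares ("stress"-minimizing) metric MDS using scikit-learn's generic SMACOF-based solver as a stand-in (the subspace-restricted least-squares solver that this paper proposes is not shown, and the input distances below are synthetic placeholders):

```python
import numpy as np
from sklearn.manifold import MDS

# Synthetic stand-in for the input metric: pairwise Euclidean distances of
# random 10-dimensional points (in geometry processing these would be
# geodesic distances measured on a non-rigid shape).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

# Least-squares ("stress") metric MDS: find a 2-D configuration whose
# pairwise distances best match D; sklearn minimizes stress with SMACOF.
mds = MDS(n_components=2, metric=True, dissimilarity='precomputed',
          random_state=0)
Y = mds.fit_transform(D)
print(Y.shape, mds.stress_)  # (60, 2) and the residual stress value
```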